993 results for Research labs


Relevance:

60.00%

Publisher:

Abstract:

Creative productivity emerges from human interactions (Hartley, 2009, p. 214). In an era when life is lived in rather than with media (Deuze, this issue), this productivity is widely distributed among ephemeral social networks mediated through the internet. Understanding the underlying dynamics of these networks of human interaction is an exciting and challenging task that requires us to come up with new ways of thinking and theorizing. For example, inducting theory from case studies that are designed to show the exceptional dynamics present within single settings can be augmented today by large-scale data generation and collection that provides new analytic opportunities to research the diversity and complexity of human interaction. Large-scale data generation and collection is occurring across a wide range of individuals and organisations, offering a massive field of analysis that internet companies and research labs in particular are keen to explore. Lazer et al. (2009: 721) argue that such analytic potential is transformational for many if not most research fields, but that the use of such valuable data must not remain confined to private companies and government agencies, nor to a privileged set of academic researchers whose studies can be neither replicated nor critiqued. In fact, the availability of data of such unprecedented scope and scale requires us to analyse not only what is and could be done with it, and by whom (1), but also what it is doing to us, our cultures and societies (2). Part (1) of such analysis is interested in dependencies and their implications. Part (2) of the enquiry embeds part (1) in a larger context that analyses the long-term, complex dynamics of networked human interaction. From the latter perspective we can treat specific phenomena, and the methods used to analyse them, as moments of evolution.

Relevance:

60.00%

Publisher:

Abstract:

This paper is not about the details of yet another robot control system, but rather the issues surrounding real-world robotic implementation. It is a fact that in order to realise a future where robots co-exist with people in everyday places, we have to pass through a developmental phase that involves some risk. Putting a “Keep Out, Experiment in Progress” sign on the door is no longer possible, since we are now at a level of capability that requires testing over long periods of time in complex, realistic environments that contain people. We all know that controlling the risk is important – a serious accident could set the field back globally – but just as important is convincing others that the risks are known and controlled. In this article, we describe our experience going down this path, and we show that health and safety assessment for mobile robotics research is still unexplored territory in universities and is often ignored. We hope that the article will prompt university robotics research labs around the world to take note of these issues rather than operate under the radar, and thereby prevent any catastrophic accidents.

Relevance:

60.00%

Publisher:

Abstract:

The growth of the world population, together with increasing industrialization and consumption of goods and services, has significantly increased the generation of waste, which has been causing negative impacts on human and environmental health. In this context, the generation of hazardous products stands out, such as healthcare service waste (RSS). Because such waste poses risks to public health and the environment, recommendations, standards and legislation have emerged to guide the proper handling and final disposal of this waste. In Brazil, resolutions RDC 306/04 (Anvisa) and CONAMA 358/05 provide guidelines for drafting a Healthcare Waste Management Plan (PGRSS). Research and teaching laboratories, as generators of RSS, must comply with the legislation, but few studies exist and the legislation does not specifically address the waste produced by these laboratories. The UERJ laboratories and units that generate RSS do not have a PGRSS. At UERJ, only two studies have surveyed the waste generated in laboratories, and the data collected for the Institute of Biology are incomplete. This study sought to evaluate the handling of biological, chemical, radioactive and sharps waste in the laboratories of the Institute of Biology. Data were collected from information provided by the laboratories' professors, staff and students, and by direct observation. Handling data were analyzed in accordance with Anvisa RDC 306/04, CONAMA Resolution 358/05 and the safety data sheets of the chemical products. Of the Institute of Biology laboratories, 83% were studied; of these, 43% generate chemical waste, and 19 of the characterized laboratories generate only chemical waste. The Américo Piquet building houses 63% of the laboratories generating biological, chemical, sharps or radioactive waste. Of the total waste generated in the laboratories, about 80% was biological waste, 15% chemical waste and 5% sharps waste.
Waste handling in the laboratories is carried out in a confused manner; the errors generally lie in segregation, identification and packaging. In general, information about the handling used for the waste is incomplete, unknown or imprecise. Incorrect handling practices are characteristic of each type of waste: ordinary waste was frequently found among the biological waste; chemical waste is generally discarded into the sewage system without prior treatment; radioactive waste lacks identification and decay monitoring prior to disposal; and biological and chemical waste were frequently found mixed in with the sharps. For a future Waste Management Plan to succeed, the training of professionals is very important. The institution must invest in consolidating this work, considering that it cannot avoid adopting a proactive stance toward environmental problems, whether on the part of its administrators or of the professionals who work there. It is hoped that this research can help in this regard.

Relevance:

60.00%

Publisher:

Abstract:

This dissertation was developed at the Evandro Chagas National Institute of Infectious Diseases (INI) of the Fundação Oswaldo Cruz (Fiocruz), a multipurpose public health organization with research, care and teaching activities, which adopted the strategy of innovating its organizational structure in order to guarantee the quality of the services provided to the population and to reinforce the entrepreneurial orientation of its integrated activities. From the standpoint of accounting research, the restructuring of INI is a potential example of an organization accumulating intangible assets, and this gave rise to the research problem: is the application of the DEA method to assess the efficiency gains of INI's eight multipurpose research laboratories associated with the theoretical framework that justifies the organizational innovation implemented at the Institute after it was proposed at Fiocruz's Internal Congresses? The general objective was to analyze the synthesis scores of Data Envelopment Analysis (DEA), as a measure of intangible assets, associating them with the effects of strategic organizational changes, characterized as organizational innovations, in the eight multipurpose clinical research laboratories (i.e., laboratories that combine teaching, research and care) at INI/Fiocruz. The method consisted of four stages. In the first, the literature on intangible assets, organizational innovation, organizational structures and efficiency analysis models was reviewed. In the second, qualitative indicators of the change in the organizational structure of the eight INI laboratories were collected through documentary analysis of Fiocruz's Internal Congresses and an opinion survey of laboratory representatives; quantitatively, data were gathered to compute an efficiency indicator for each laboratory.
In the third stage, the data collected for the period 2006-2012 were analyzed, using the computed indicators to compare the efficiency of these activities before and after the organizational innovation associated with the restructuring. Finally, the fourth stage presented the results and the corresponding conclusions. As a contribution, the study presents an association between the organizational innovation resulting from the restructuring of the eight clinical research laboratories and the results of the empirical DEA method.
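The DEA scores referred to above come from a linear-programming model solved once per decision-making unit (DMU). A minimal sketch of the standard input-oriented CCR envelopment model in Python, using hypothetical laboratory data (the actual inputs and outputs used in the dissertation are not specified here):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR DEA efficiency score for each DMU.
    X: (n_dmu, n_inputs); Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for k in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]            # minimize theta
        # Input constraints:  sum_j lambda_j * x_ij - theta * x_ik <= 0
        A_in = np.hstack([-X[k].reshape(m, 1), X.T])
        # Output constraints: sum_j lambda_j * y_rj >= y_rk
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[k]],
                      bounds=[(0, None)] * (n + 1),
                      method="highs")
        scores.append(res.x[0])
    return scores

# Hypothetical data: 3 labs, one input (staff), one output (publications)
X = np.array([[2.0], [4.0], [4.0]])
Y = np.array([[2.0], [2.0], [4.0]])
print(dea_ccr_input(X, Y))  # lab 2 is dominated; labs 1 and 3 are efficient
```

A score of 1 marks a lab on the efficient frontier; scores below 1 indicate the proportional input reduction the lab would need to reach it.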

Relevance:

60.00%

Publisher:

Abstract:

Postgraduate project/dissertation presented to Universidade Fernando Pessoa in partial fulfillment of the requirements for the degree of Master in Pharmaceutical Sciences.

Relevance:

60.00%

Publisher:

Abstract:

Simultaneous non-invasive visualization of blood vessels and nerves in patients can be obtained in the eye. The retinal vasculature is a target of many retinopathies. Inflammation, readily manifest by leukocyte adhesion to the endothelial lining, is a key pathophysiological mechanism of many retinopathies, making it a valuable and ubiquitous target for disease research. Leukocyte fluorography has been extensively used in the past twenty years; however, fluorescent markers, visualization techniques, and recording methods have differed between studies. The lack of detailed protocol papers regarding leukocyte fluorography, coupled with lack of uniformity between studies, has led to a paucity of standards for leukocyte transit (velocity, adherence, extravasation) in the retina. Here, we give a detailed description of a convenient method using acridine orange (AO) and a commercially available scanning laser ophthalmoscope (SLO, HRA-OCT Spectralis) to view leukocyte behavior in the mouse retina. Normal mice are compared to mice with acute and chronic inflammation. This method can be readily adopted in many research labs.

Relevance:

60.00%

Publisher:

Abstract:

Vertical line arrays (VLAs) are widely used in underwater acoustics, with applications in sonar prediction, underwater communications and acoustic tomography, among others. Recent developments in digital electronics and communications allow for off-the-shelf development of VLA systems with a large number of embedded acoustic and non-acoustic sensors able to fulfill application requirements, as opposed to the single- or few-receiver configurations available until only a few years ago. Very often, flexibility in water column sampling is achieved by splitting the VLA into modules that can be assembled according to the application. Such systems can be deployed and recovered from small vessels with a shorthanded crew, and make it possible for research labs with reduced budgets and operational means (ships and manpower) to gain control over the whole development process, from data acquisition to post-processing.

Relevance:

60.00%

Publisher:

Abstract:

A significant number of autistic children have macrocephaly. Despite several studies of head circumference in autism, few have been conducted on adults, and the current reference standards for adult head circumference (HC) are about 20 years old. The objectives of this study were to build a reference scale for adult HC and to compare macrocephaly rates between a group of autistic adults and a group of neurotypical adults. In this study, 221 adult male subjects were recruited from various settings in order to determine the best predictive model of HC and to build the reference scale. Height and weight were measured for each participant to determine their influence on cranial dimensions. For the comparative part, 30 autistic and 36 neurotypical subjects, all adults, were recruited from the research laboratory's database. For the reference scale, the results showed positive correlations of HC with both height and weight. After analysis, the joint regression of height and weight on HC was found to be the model offering the most significant results in predicting HC. For the comparative part, macrocephaly rates reached 10.00% among autistic subjects versus 2.56% among neurotypical subjects according to the linear regression formula obtained from the model. However, Fisher's exact test revealed no significant difference between the two groups. My results suggest that height and weight must be considered when building an HC reference and that, even with the new reference, macrocephaly rates remain higher in autistic adults than in neurotypical adults despite the absence of a significant difference.
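The group comparison above rests on Fisher's exact test applied to macrocephaly counts. A minimal sketch in Python, using illustrative counts of 3/30 autistic and 1/36 neurotypical subjects (hypothetical values chosen to roughly match the reported rates; the abstract does not give the actual counts):

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = group, columns = (macrocephalic, not)
# 3/30 autistic = 10.00%; 1/36 neurotypical ~ 2.8% (abstract reports 2.56%)
table = [[3, 27],
         [1, 35]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```

With counts this small the test is underpowered, which is consistent with the abstract's finding of a higher rate without statistical significance.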

Relevance:

60.00%

Publisher:

Abstract:

An introductory lecture on Web Science, taking a kind of devil's advocate position by suggesting that the Web is a piece of runaway technology that escaped from research labs prematurely.

Relevance:

60.00%

Publisher:

Abstract:

With new discoveries of oil and gas, the exploration of fields in various geological basins, imports of other oils and the development of alternative fuels, more and more research labs have been evaluating and characterizing new types of petroleum and petroleum derivatives. Investment in new techniques and equipment for analyzing samples to determine their physical and chemical properties, composition, possible contaminants and product specifications, among others, has therefore multiplied in recent years, so the development of techniques for rapid and efficient characterization is extremely important for better economic recovery of oil. In this context, this work has two main objectives. The first is to characterize oils by thermogravimetry coupled with mass spectrometry (TG-MS) and to correlate these results with previously reported characterization data. The second is to use the technique to develop a methodology for obtaining the evolution curve of hydrogen sulfide gas in oil. Four samples were analyzed by TG-MS and X-ray fluorescence spectrometry (XRF). The TG results can be used to indicate the nature of an oil, its tendency toward coke formation, its distillation and cracking temperatures, and other features. The MS evaluations revealed the behavior of the main oil compounds with temperature, the points at which certain fractions volatilized, and the evolution of hydrogen sulfide gas, which is compared with the evolution curve obtained by Petrobras using another methodology.

Relevance:

60.00%

Publisher:

Abstract:

The European Union set the ambitious target of reducing energy consumption by 20% by 2020. This goal demands a tremendous change in how we generate and consume energy and urgently calls for an aggressive policy on energy efficiency. Since 19% of European electrical energy is used for lighting, considerable savings can be achieved with the development of novel and more efficient lighting systems. In this thesis, accomplished in the frame of the EU project CELLO, I report some selected goals we achieved in attempting to develop highly efficient, flat, low-cost and flexible light sources using Light-Emitting Electrochemical Cells (LECs) based on ionic cyclometalated iridium(III) complexes. After an extensive introduction to LECs and solid-state lighting in general, I focus on the research we carried out on cyclometalated iridium(III) complexes displaying deep-blue emission, which has turned out to be a rather challenging task. In order to demonstrate the wide versatility of this class of compounds, I also report a case in which some tailored iridium(III) complexes act as near-infrared (NIR) sources. In fact, standard NIR-emitting devices are typically expensive and, also in this case, LECs could serve as low-cost alternatives in fields where NIR luminescence is crucial, such as telecommunications and bioimaging. Since LECs are based on only one active material, in the last chapter I stress the importance of an integrated approach to the selection of suitable emitters, not only from the photophysical point of view but also from that of materials science. An iridium(III) complex, once in the device, interacts with ionic liquids, metal cathodes, electric fields, etc. All these interactions should be taken into account if Europe really wants to implement more efficient lighting paradigms, generating light beyond research labs.

Relevance:

60.00%

Publisher:

Abstract:

Synthetic oligonucleotides and peptides have found wide applications in industry and academic research labs. There are ~60 peptide drugs on the market and over 500 under development. The global annual sale of peptide drugs in 2010 was estimated to be $13 billion. There are three oligonucleotide-based drugs on the market; among them, the newly FDA-approved Kynamro was predicted to have $100 million in annual sales. The annual sale of oligonucleotides to academic labs was estimated to be $700 million. Both bio-oligomers are mostly synthesized on automated synthesizers using solid-phase synthesis technology, in which nucleoside or amino acid monomers are added sequentially until the desired full-length sequence is reached. The additions cannot be complete, which generates truncated, undesired failure sequences. For almost all applications, these impurities must be removed. The most widely used method is HPLC. However, the method is slow, expensive, labor-intensive, not amenable to automation, difficult to scale up, and unsuitable for high-throughput purification. It needs large capital investment and consumes large volumes of harmful solvents. Purification costs are estimated to be more than 50% of total production costs. Other methods for bio-oligomer purification also have drawbacks, and are less favored than HPLC for most applications. To overcome the problems of known biopolymer purification technologies, we have developed two non-chromatographic purification methods: (1) catching failure sequences by polymerization, and (2) catching full-length sequences by polymerization. In the first method, a polymerizable group is attached to the failure sequences of the bio-oligomers during automated synthesis; purification is achieved by simply polymerizing the failure sequences into an insoluble gel and extracting the full-length sequences.
In the second method, a polymerizable group is attached to the full-length sequences, which are then incorporated into a polymer; impurities are removed by washing, and the pure product is cleaved from the polymer. These methods do not need chromatography, so the drawbacks of HPLC no longer apply. Using them, purification is achieved by simple manipulations such as shaking and extraction. They are therefore suitable for large-scale purification of oligonucleotide and peptide drugs, and also ideal for high-throughput purification, for which there is currently high demand in research projects involving total gene synthesis. This dissertation presents the details of the development of these techniques. Chapter 1 introduces oligodeoxynucleotides (ODNs) and their synthesis and purification. Chapter 2 describes detailed studies of using the catching-failure-sequences-by-polymerization method to purify ODNs. Chapter 3 describes the further optimization of this ODN purification technology to the level of practical use. Chapter 4 presents ODN purification using the catching-full-length-sequences-by-polymerization method with an acid-cleavable linker. Chapter 5 introduces peptides and their synthesis and purification. Chapter 6 describes studies using the catching-full-length-sequences-by-polymerization method for peptide purification.

Relevance:

60.00%

Publisher:

Abstract:

Microarray technology provides a high-throughput technique to study gene expression. Microarrays can help us diagnose different types of cancers, understand biological processes, assess host responses to drugs and pathogens, find markers for specific diseases, and much more. Microarray experiments generate large amounts of data; thus, effective data processing and analysis are critical for making reliable inferences from the data.
The first part of the dissertation addresses the problem of finding an optimal set of genes (biomarkers) to classify a set of samples as diseased or normal. Three statistical gene selection methods (GS, GS-NR, and GS-PCA) were developed to identify a set of genes that best differentiate between samples. A comparative study of different classification tools was performed, and the best combinations of gene selection methods and classifiers for multi-class cancer classification were identified. For most of the benchmark cancer data sets, the gene selection method proposed in this dissertation, GS, outperformed the other gene selection methods. Classifiers based on Random Forests, neural network ensembles, and K-nearest neighbors (KNN) showed consistently good performance. A striking commonality among these classifiers is that they all use a committee-based approach, suggesting that ensemble classification methods are superior.
The same biological problem may be studied at different research labs and/or performed using different lab protocols or samples. In such situations, it is important to combine results from these efforts. The second part of the dissertation addresses the problem of pooling the results from different independent experiments to obtain improved results. Four statistical pooling techniques (Fisher's inverse chi-square method, the Logit method, Stouffer's Z-transform method, and the Liptak-Stouffer weighted Z-method) were investigated. These pooling techniques were applied to the problem of identifying cell cycle-regulated genes in two different yeast species. As a result, improved sets of cell cycle-regulated genes were identified.
The last part of the dissertation explores the effectiveness of wavelet data transforms for the task of clustering. Discrete wavelet transforms, with an appropriate choice of wavelet bases, were shown to be effective in producing clusters that were biologically more meaningful.
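Three of the four pooling techniques named above are standard p-value combination methods available off the shelf. A minimal sketch in Python with hypothetical p-values from three independent experiments, using scipy's `combine_pvalues`, which implements Fisher's and Stouffer's methods, including the weighted Liptak-Stouffer variant:

```python
from scipy.stats import combine_pvalues

# Hypothetical p-values for one gene, from three independent experiments
pvals = [0.04, 0.10, 0.03]

# Fisher's inverse chi-square method: -2 * sum(log p) ~ chi2(2k)
chi2_stat, p_fisher = combine_pvalues(pvals, method="fisher")

# Stouffer's Z-transform method: sum(Phi^-1(1 - p)) / sqrt(k)
z_stat, p_stouffer = combine_pvalues(pvals, method="stouffer")

# Liptak-Stouffer weighted Z-method: weight each study, e.g. by sample size
weights = [50, 120, 80]
_, p_weighted = combine_pvalues(pvals, method="stouffer", weights=weights)

print(f"Fisher: p = {p_fisher:.4f}; Stouffer: p = {p_stouffer:.4f}")
```

Note how three individually marginal p-values pool into a clearly significant combined result; the weighted variant lets larger or more reliable experiments contribute more to the pooled evidence.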

Relevance:

60.00%

Publisher:

Abstract:

Documents pertaining to the establishment of teaching facilities, research labs, and a medical library for the College of Medicine.